An augmented Lagrangian method with constraint generation for shape-constrained convex regression problems


Abstract

The shape-constrained convex regression problem deals with fitting a convex function to observed data, where additional constraints such as component-wise monotonicity and uniform Lipschitz continuity are imposed. This paper provides a unified framework for computing the least squares estimator of a multivariate shape-constrained convex regression function in $${\mathbb {R}}^d$$. We prove that the estimator is computable via solving an essentially constrained convex quadratic programming (QP) problem with $$(d+1)n$$ variables, $$n(n-1)$$ linear inequality constraints, and $$n$$ possibly non-polyhedral inequality constraints, where $$n$$ is the number of data points. To efficiently solve this generally very large-scale convex QP, we design a proximal augmented Lagrangian method (proxALM) whose subproblems are solved by the semismooth Newton method. To further accelerate the computation when $$n$$ is huge, we derive a practical implementation of the constraint generation method such that each reduced problem is efficiently solved by our proposed proxALM. Comprehensive numerical experiments, including those in the pricing of basket options and the estimation of production functions in economics, demonstrate that our proposed proxALM outperforms state-of-the-art algorithms, and that the proposed acceleration technique further shortens the computation time by a large margin.
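To make the QP formulation in the abstract concrete, the following is a minimal sketch of the convex-regression least squares problem on a tiny $$d=1$$ instance. It uses SciPy's generic SLSQP solver rather than the paper's proxALM, and the toy data are illustrative assumptions; the $$n(n-1)$$ linear constraints are the subgradient inequalities $$\theta _j + \xi _j^\top (X_i - X_j) \le \theta _i$$ that enforce convexity of the fitted function.

```python
import numpy as np
from scipy.optimize import minimize

# Toy d=1 instance (hypothetical data, not from the paper): fit a convex
# function to n points. Variables are z = (theta_1..theta_n, xi_1..xi_n),
# i.e. fitted values and subgradients, giving (d+1)n = 2n variables.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = x**2  # convex data, so the least squares fit should be near-exact
n = len(x)

def objective(z):
    # Least squares criterion: sum_i (theta_i - y_i)^2
    theta = z[:n]
    return np.sum((theta - y) ** 2)

# Convexity (subgradient) constraints: theta_j + xi_j*(x_i - x_j) <= theta_i
# for all i != j -- the n(n-1) linear inequalities from the abstract.
constraints = []
for i in range(n):
    for j in range(n):
        if i != j:
            constraints.append({
                "type": "ineq",  # SLSQP convention: fun(z) >= 0
                "fun": lambda z, i=i, j=j: z[i] - z[j] - z[n + j] * (x[i] - x[j]),
            })

res = minimize(objective, np.zeros(2 * n), constraints=constraints, method="SLSQP")
theta_hat = res.x[:n]
print(np.round(theta_hat, 3))
```

A generic solver like SLSQP is adequate only at this toy scale; the paper's point is that real instances have $$O(n^2)$$ constraints, which is what motivates proxALM with semismooth Newton subproblems and the constraint generation scheme.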


Similar articles

Smoothing augmented Lagrangian method for nonsmooth constrained optimization problems

In this paper, we propose a smoothing augmented Lagrangian method for finding a stationary point of a nonsmooth and nonconvex optimization problem. We show that any accumulation point of the iteration sequence generated by the algorithm is a stationary point provided that the penalty parameters are bounded. Furthermore, we show that a weak version of the generalized Mangasarian-Fromovitz constr...


A parallelizable augmented Lagrangian method applied to large-scale non-convex-constrained optimization problems

We contribute improvements to a Lagrangian dual solution approach applied to large-scale optimization problems whose objective functions are convex, continuously differentiable and possibly nonlinear, while the nonrelaxed constraint set is compact but not necessarily convex. Such problems arise, for example, in the split-variable deterministic reformulation of stochastic mixed-integer optimizat...


An Augmented Lagrangian Method for Conic Convex Programming

We propose a new first-order augmented Lagrangian algorithm ALCC for solving convex conic programs of the form min{ρ(x) + γ(x) : Ax − b ∈ K, x ∈ χ}, where ρ : R^n → R ∪ {+∞}, γ : R^n → R are closed, convex functions, and γ has a Lipschitz continuous gradient, A ∈ R^{m×n}, K ⊂ R^m is a closed convex cone, and χ ⊂ dom(ρ) is a "simple" convex compact set such that optimization problems of the form mi...


An adaptive augmented Lagrangian method for large-scale constrained optimization

We propose an augmented Lagrangian algorithm for solving large-scale constrained optimization problems. The novel feature of the algorithm is an adaptive update for the penalty parameter motivated by recently proposed techniques for exact penalty methods. This adaptive updating scheme greatly improves the overall performance of the algorithm without sacrificing the strengths of the core augment...


An augmented Lagrangian trust region method for equality constrained optimization

In this talk, we present a trust region method for solving equality constrained optimization problems, which is motivated by the famous augmented Lagrangian function. It is different from standard augmented Lagrangian methods where the augmented Lagrangian function is minimized at each iteration. This method, for fixed Lagrange multiplier and penalty parameters, tries to minimize an approximate...



Journal

Journal title: Mathematical Programming Computation

Year: 2021

ISSN: 1867-2949 (print), 1867-2957 (electronic)

DOI: https://doi.org/10.1007/s12532-021-00210-0